
    Bayesian Nonparametric Calibration and Combination of Predictive Distributions

    We introduce a Bayesian approach to predictive density calibration and combination that accounts for parameter uncertainty and model set incompleteness through the use of random calibration functionals and random combination weights. Building on the work of Ranjan and Gneiting (2010) and Gneiting and Ranjan (2013), we use infinite beta mixtures for the calibration. The proposed Bayesian nonparametric approach takes advantage of the flexibility of Dirichlet process mixtures to achieve any continuous deformation of linearly combined predictive distributions. The inference procedure is based on Gibbs sampling and accounts for uncertainty in the number of mixture components, the mixture weights, and the calibration parameters. Weak posterior consistency of the Bayesian nonparametric calibration is established under suitable conditions on the unknown true density. We study the methodology in simulation examples with fat-tailed and multimodal densities and apply it to density forecasts of daily S&P returns and daily maximum wind speed at Frankfurt airport.
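
    A minimal numerical sketch may help fix ideas: a linear pool of predictive CDFs is passed through a mixture of Beta CDFs, the finite-mixture analogue of the infinite beta-mixture calibration described above. The Gaussian components, pool weights and beta parameters below are illustrative assumptions, not the paper's estimates.

```python
# Sketch of beta-mixture calibration of a linearly combined predictive CDF
# (a finite K-component beta mixture stands in for the infinite/DP mixture).
import numpy as np
from scipy.stats import beta, norm

def combined_cdf(y, cdfs, weights):
    """Linear pool: H(y) = sum_k w_k F_k(y)."""
    return sum(w * F(y) for w, F in zip(weights, cdfs))

def calibrated_cdf(y, cdfs, weights, beta_params, beta_weights):
    """Calibrated CDF: G(H(y)) with G a mixture of Beta CDFs on [0, 1]."""
    u = combined_cdf(y, cdfs, weights)
    return sum(pi * beta.cdf(u, a, b)
               for pi, (a, b) in zip(beta_weights, beta_params))

# Toy example: two Gaussian predictive distributions, equal pool weights,
# and a two-component beta mixture (all parameter values chosen for illustration).
cdfs = [norm(0.0, 1.0).cdf, norm(0.5, 2.0).cdf]
weights = [0.5, 0.5]
beta_params = [(2.0, 1.5), (0.8, 0.8)]
beta_weights = [0.6, 0.4]

y_grid = np.linspace(-4, 4, 9)
print(calibrated_cdf(y_grid, cdfs, weights, beta_params, beta_weights))
```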

    A Bayesian multi-factor model of instability in prices and quantities of risk in U.S. financial markets

    This paper analyzes the empirical performance of two alternative ways in which multi-factor models with time-varying risk exposures and premia may be estimated. The first method echoes the seminal two-pass approach advocated by Fama and MacBeth (1973). The second approach extends previous work by Ouysse and Kohn (2010) and is based on a Bayesian approach to modelling the latent process followed by risk exposures and idiosyncratic volatility. Our application to monthly 1979-2008 U.S. data for stock, bond, and publicly traded real estate returns shows that the classical two-stage approach, which relies on a nonparametric, rolling-window modelling of time-varying betas, yields results that are unreasonable: there is evidence that all the portfolios of stocks, bonds, and REITs have been grossly overpriced. By contrast, the Bayesian approach yields sensible results, as most portfolios do not appear to have been mispriced and a few risk premia are precisely estimated with a plausible sign. Real consumption growth risk turns out to be the only factor that is persistently priced throughout the sample.
    Keywords: Econometric models; Stochastic analysis; Financial markets
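
    To make the two-pass logic concrete, here is a minimal sketch of the Fama-MacBeth procedure with rolling-window betas, the classical approach referred to above. The simulated returns and factors, window length and portfolio count are illustrative assumptions; the paper's Bayesian latent-process estimator is not reproduced here.

```python
# Sketch of the classical Fama-MacBeth (1973) two-pass procedure with
# rolling-window betas; portfolio returns and factors are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
T, N, K, window = 240, 10, 3, 60          # months, portfolios, factors, rolling window
factors = rng.normal(size=(T, K))
betas_true = rng.normal(size=(N, K))
returns = factors @ betas_true.T + rng.normal(scale=0.5, size=(T, N))

lambdas = []
for t in range(window, T):
    # First pass: time-series regression over the trailing window -> betas at t.
    X = np.column_stack([np.ones(window), factors[t - window:t]])
    B = np.linalg.lstsq(X, returns[t - window:t], rcond=None)[0][1:]   # (K, N)
    # Second pass: cross-sectional regression of month-t returns on those betas.
    Z = np.column_stack([np.ones(N), B.T])
    lambdas.append(np.linalg.lstsq(Z, returns[t], rcond=None)[0][1:])

# Risk premia estimates = time averages of the cross-sectional slopes.
print(np.mean(lambdas, axis=0))
```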

    Measuring sovereign contagion in Europe

    This paper analyzes sovereign risk shift-contagion, i.e. positive and significant changes in the propagation mechanisms, using bond yield spreads for the major eurozone countries. Using two econometric approaches based on quantile regressions (standard quantile regression and Bayesian quantile regression with heteroskedasticity), we find that the propagation of shocks in eurozone bond yield spreads shows almost no evidence of shift-contagion in the sample periods considered (2003–2006, Nov. 2008–Nov. 2011, Dec. 2011–Apr. 2013). Shock transmission is no different on days with large spread changes than on days with small changes. This is the case even though a significant number of the countries in our sample have been severely affected by their sovereign debt and fiscal situations. The risk spillover among these countries is not affected by the size or sign of the shock, implying that so far contagion has remained subdued. However, the US crisis does generate a change in the intensity of the propagation of shocks in the eurozone between the 2003–2006 pre-crisis period and the Nov. 2008–Nov. 2011 post-Lehman one, but the coefficients actually go down, not up. All the increases in correlation witnessed over recent years come from larger shocks and the heteroskedasticity in the data, not from similar shocks propagated with higher intensity across Europe. These surprising but robust results emerge because this is, to our knowledge, the first paper in which a Bayesian quantile regression approach allowing for heteroskedasticity is used to measure contagion. This methodology is particularly well suited to dealing with nonlinear and unstable transmission mechanisms, especially when asymmetric responses to sign and size are suspected.
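
    As a rough illustration of the quantile-regression side of the analysis, the sketch below estimates the transmission coefficient of one country's spread changes onto another's at several quantiles using standard (frequentist) quantile regression. The data are simulated, and the Bayesian heteroskedastic variant used in the paper is not implemented here.

```python
# Sketch of a shift-contagion check via quantile regression: compare the
# transmission coefficient across quantiles (and, in practice, subperiods).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
d_source = rng.standard_t(df=4, size=n)                  # source-country spread changes
d_target = 0.6 * d_source + rng.standard_t(df=4, size=n) # target-country spread changes
data = pd.DataFrame({"d_target": d_target, "d_source": d_source})

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("d_target ~ d_source", data).fit(q=q)
    print(f"tau={q}: beta={fit.params['d_source']:.3f}")

# Shift-contagion would show up as a significantly larger beta in the tails
# (or in a crisis subsample) than at the median / in tranquil periods.
```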

    Predicting the term structure of interest rates incorporating parameter uncertainty, model uncertainty and macroeconomic information

    We forecast the term structure of U.S. Treasury zero-coupon bond yields by analyzing a range of models that have been used in the literature. We assess the relevance of parameter uncertainty by examining the added value of Bayesian inference relative to frequentist estimation techniques, and the relevance of model uncertainty by combining forecasts from individual models. Following the current literature, we also investigate the benefits of incorporating macroeconomic information in yield curve models. Our results show that adding macroeconomic factors is very beneficial for improving the out-of-sample forecasting performance of individual models. Despite this, the predictive accuracy of the models varies considerably over time, irrespective of whether the Bayesian or the frequentist approach is used. We show that mitigating model uncertainty by combining forecasts leads to substantial gains in forecasting performance, especially when applying Bayesian model averaging.
    Keywords: Term structure of interest rates; Nelson-Siegel model; Affine term structure model; forecast combination; Bayesian analysis
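
    For readers unfamiliar with the Nelson-Siegel model mentioned in the keywords, the following sketch recovers the level, slope and curvature factors by least squares with the decay parameter fixed at the common Diebold-Li value. The yields shown are illustrative, not the data used in the paper.

```python
# Sketch of the Nelson-Siegel yield-curve decomposition: with the decay
# parameter lambda fixed, the level/slope/curvature factors follow from OLS
# on the standard loadings.
import numpy as np

def ns_loadings(maturities, lam=0.0609):
    """Nelson-Siegel loadings for level, slope and curvature (maturities in months)."""
    m = np.asarray(maturities, dtype=float)
    slope = (1 - np.exp(-lam * m)) / (lam * m)
    curvature = slope - np.exp(-lam * m)
    return np.column_stack([np.ones_like(m), slope, curvature])

maturities = np.array([3, 6, 12, 24, 36, 60, 84, 120])       # months
yields = np.array([4.2, 4.3, 4.5, 4.8, 5.0, 5.3, 5.4, 5.5])  # illustrative, percent

X = ns_loadings(maturities)
beta_hat, *_ = np.linalg.lstsq(X, yields, rcond=None)
print("level, slope, curvature factors:", beta_hat)
```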

    Forecast densities for economic aggregates from disaggregate ensembles

    We extend the “bottom up” approach for forecasting economic aggregates with disaggregates to probability forecasting. Our methodology utilises a linear opinion pool to combine the forecast densities from many disaggregate forecasting specifications, using weights based on the continuous ranked probability score. We also adopt a post-processing step prior to forecast combination. These methods are adapted from the meteorology literature. In our application, we use our approach to forecast US Personal Consumption Expenditure (PCE) inflation from 1990q1 to 2009q4. Our ensemble combining the evidence from 16 disaggregate PCE series outperforms an integrated moving average specification for aggregate inflation in terms of density forecasting. We thank the ARC, Norges Bank, the Reserve Bank of Australia and the Reserve Bank of New Zealand for supporting this research (LP 0991098).
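
    A compact sketch of a CRPS-weighted linear opinion pool may clarify the combination step. The Gaussian component forecasts, the evaluation sample and the inverse-CRPS weighting rule are illustrative assumptions rather than the exact recursion and post-processing used in the paper.

```python
# Sketch of a CRPS-weighted linear opinion pool over disaggregate forecast densities.
# Gaussian component forecasts are assumed purely for illustration; the closed-form
# Gaussian CRPS scores each model on past data, and pool weights are taken inversely
# proportional to average CRPS (one simple weighting rule).
import numpy as np
from scipy.stats import norm

def crps_gaussian(x, mu, sig):
    """Closed-form CRPS of a Gaussian forecast N(mu, sig^2) at observation x."""
    z = (x - mu) / sig
    return sig * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

# Each disaggregate specification issues a Gaussian density forecast (mu, sigma).
models = [(2.0, 0.5), (1.5, 0.8), (2.4, 0.6)]
past_obs = np.array([2.1, 1.9, 2.3, 2.0])            # realisations used for scoring

avg_crps = np.array([crps_gaussian(past_obs, mu, sig).mean() for mu, sig in models])
weights = (1.0 / avg_crps) / (1.0 / avg_crps).sum()  # lower CRPS -> higher weight

def pool_pdf(y):
    """Linear opinion pool: weighted mixture of the component densities."""
    return sum(w * norm.pdf(y, mu, sig) for w, (mu, sig) in zip(weights, models))

print("weights:", np.round(weights, 3), "| pooled density at y=2.0:", round(pool_pdf(2.0), 3))
```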

    Relationship between the forces applied to the starting blocks and block clearance in a sprint start

    We introduce a Combined Density Nowcasting (CDN) approach to Dynamic Factor Models (DFM) that in a coherent way accounts for time-varying uncertainty of several model and data features in order to provide more accurate and complete density nowcasts. The combination weights are latent random variables that depend on past nowcasting performance and other learning mechanisms. The combined density scheme is incorporated in a Bayesian Sequential Monte Carlo method which re-balances the set of nowcasted densities in each period using updated information on the time-varying weights. Experiments with simulated data show that CDN works particularly well in a situation of early data releases with relatively large data uncertainty and model incompleteness. Empirical results, based on US real-time […]

    Are low frequency macroeconomic variables important for high frequency electricity prices?

    We analyse the importance of low-frequency hard and soft macroeconomic information, respectively the industrial production index and the manufacturing Purchasing Managers' Index surveys, for forecasting high-frequency daily electricity prices in two of the main European markets, Germany and Italy. We do so by means of mixed-frequency models, introducing a Bayesian approach to reverse unrestricted MIDAS models (RU-MIDAS). Despite the generally parsimonious structure of standard MIDAS models, the RU-MIDAS model has a large set of parameters when several predictors are considered simultaneously, and Bayesian inference is useful for imposing parameter restrictions. We study the forecasting accuracy for different horizons (from 1 day ahead to 28 days ahead) and for different specifications of the models. Results indicate that the macroeconomic low-frequency variables are more important for short horizons than for longer horizons. Moreover, accuracy increases by combining hard and soft information, and using only surveys gives less accurate forecasts than using only industrial production data. This paper previously circulated with the title "Forecasting daily electricity prices with monthly macroeconomic variables" (ECB Working Paper Series No. 2250).
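
    The following simplified sketch conveys the mixed-frequency idea behind RU-MIDAS: a daily series is regressed, without polynomial restrictions, on its own lag and a monthly indicator held constant within the month. The simulated data, the single low-frequency regressor and the OLS estimation are assumptions for illustration; the paper's Bayesian treatment of the full RU-MIDAS parameter set is omitted.

```python
# Simplified sketch of a mixed-frequency regression in the spirit of RU-MIDAS:
# the daily (high-frequency) price is regressed on its own lag plus the most
# recently released monthly (low-frequency) indicator.
import numpy as np

rng = np.random.default_rng(2)
n_months, days_per_month = 24, 22
monthly_ip = rng.normal(size=n_months)             # e.g. industrial production growth
daily_ip = np.repeat(monthly_ip, days_per_month)   # hold the last release constant within the month
T = n_months * days_per_month

price = np.zeros(T)
for t in range(1, T):
    price[t] = 0.7 * price[t - 1] + 0.3 * daily_ip[t] + rng.normal(scale=0.5)

# Unrestricted regression: one free coefficient per regressor (no MIDAS polynomial).
X = np.column_stack([np.ones(T - 1), price[:-1], daily_ip[1:]])
coefs, *_ = np.linalg.lstsq(X, price[1:], rcond=None)
print("intercept, AR(1), low-frequency coefficient:", coefs)
```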

    Forecasting Financial Time Series Using Model Averaging

    In almost all cases a decision maker cannot identify the true data-generating process ex ante. This observation has led researchers to introduce several sources of uncertainty into forecasting exercises. In this context, the research reported in these pages finds that the forecasting power for financial time series increases when parameter uncertainty, model uncertainty and optimal decision making are taken into account. It also shows that, although the implementation of these techniques is often not straightforward and depends on the exercise studied, the predictive gains are statistically and economically significant across different applications, such as stock, bond and electricity markets.

    Parallel Sequential Monte Carlo for Efficient Density Combination: The DeCo MATLAB Toolbox

    This paper presents the Matlab package DeCo (Density Combination), which is based on the paper by Billio et al. (2013), where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights are time-varying and may depend on past predictive forecasting performance and other learning mechanisms. The core algorithm is the function DeCo, which applies banks of parallel Sequential Monte Carlo algorithms to filter the time-varying combination weights. The DeCo procedure has been implemented both for standard CPU computing and for Graphics Processing Unit (GPU) parallel computing. For the GPU implementation we use the Matlab Parallel Computing Toolbox and show how to use general-purpose GPU computing almost effortlessly. The GPU implementation yields a speed-up in execution time of up to seventy times compared to a standard Matlab implementation on a multicore CPU. We illustrate the use of the package and the computational gain of the GPU version through simulation experiments and an empirical application.
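
    The core idea of filtering time-varying combination weights with Sequential Monte Carlo can be sketched generically as follows. This is not the DeCo API: the random-walk law of motion for the logit-weight, the two Gaussian forecast densities and all parameter values are illustrative assumptions.

```python
# Generic sketch (not the DeCo API) of filtering a time-varying combination weight
# with a bootstrap particle filter: the latent logit-weight follows a random walk,
# and particles are reweighted by the combined predictive density at each realisation.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
T, n_particles = 100, 2000
y = rng.normal(loc=0.3, scale=1.0, size=T)     # observed series
mu = np.array([0.0, 0.6])                      # means of the two forecast densities

x = rng.normal(size=n_particles)               # particles for the latent logit-weight
weights_path = []
for t in range(T):
    x = x + 0.1 * rng.normal(size=n_particles)             # random-walk propagation
    w = 1.0 / (1.0 + np.exp(-x))                            # combination weight in (0, 1)
    lik = w * norm.pdf(y[t], mu[0], 1.0) + (1 - w) * norm.pdf(y[t], mu[1], 1.0)
    probs = lik / lik.sum()                                 # importance weights
    idx = rng.choice(n_particles, size=n_particles, p=probs)  # multinomial resampling
    x = x[idx]
    weights_path.append(w[idx].mean())                      # filtered combination weight

print("filtered weight on model 1 (last period):", round(weights_path[-1], 3))
```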